Inexact proximal Newton methods for self-concordant functions
Authors
Abstract
We analyze the proximal Newton method for minimizing a sum of a self-concordant function and a convex function with an inexpensive proximal operator. We present new results on the global and local convergence of the method when inexact search directions are used. The method is illustrated with an application to L1-regularized covariance selection, in which prior constraints on the sparsity pattern of the inverse covariance matrix are imposed. In the numerical experiments the proximal Newton steps are computed by an accelerated proximal gradient method, and multifrontal algorithms for positive definite matrices with chordal sparsity patterns are used to evaluate gradients and matrix-vector products with the Hessian of the smooth component of the objective.
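The abstract notes that the proximal Newton steps are computed with an accelerated proximal gradient method. The sketch below illustrates that overall structure in Python on a generic L1-regularized composite problem; the toy least-squares smooth term, the unit step length, and the fixed inner iteration count are illustrative assumptions and not the paper's covariance-selection setup.

# Minimal sketch of a proximal Newton method for minimize g(x) + lam*||x||_1,
# with the scaled proximal subproblem solved by an accelerated proximal
# gradient (FISTA-style) inner loop. Illustrative only: the toy quadratic g,
# the unit step, and the fixed inner iteration count are assumptions.
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_newton_step(x, grad_g, hess_g, lam, inner_iters=100):
    # Inner subproblem: minimize over d
    #   grad_g(x)^T d + 0.5 d^T H d + lam*||x + d||_1,
    # solved approximately by FISTA on this quadratic model.
    g = grad_g(x)
    H = hess_g(x)
    L = np.linalg.norm(H, 2)          # Lipschitz constant of the model gradient
    d = np.zeros_like(x)
    y, t = d.copy(), 1.0
    for _ in range(inner_iters):
        grad_model = g + H @ y
        d_new = soft_threshold(x + y - grad_model / L, lam / L) - x
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = d_new + (t - 1.0) / t_new * (d_new - d)
        d, t = d_new, t_new
    return x + d                       # unit step; a line search would go here

# Toy example: g(x) = 0.5*||A x - b||^2 with an L1 penalty.
rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((20, 5)), rng.standard_normal(20), 0.1
grad_g = lambda x: A.T @ (A @ x - b)
hess_g = lambda x: A.T @ A
x = np.zeros(5)
for _ in range(10):
    x = prox_newton_step(x, grad_g, hess_g, lam)
print(x)

Each outer iteration builds the quadratic model of the smooth term at the current iterate and runs FISTA-style inner iterations on that model; in an inexact method of the kind analyzed in the paper, the inner loop would be terminated by an accuracy test rather than a fixed iteration count.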
Similar Resources
Generalized Self-Concordant Functions: A Recipe for Newton-Type Methods
We study the smooth structure of convex functions by generalizing a powerful concept, self-concordance, introduced by Nesterov and Nemirovskii in the early 1990s, to a broader class of convex functions, which we call generalized self-concordant functions. This notion allows us to develop a unified framework for designing Newton-type methods to solve convex optimization problems. The prop...
A robust Kantorovich's theorem on the inexact Newton method with relative residual error tolerance
We prove that under semi-local assumptions, the inexact Newton method with a fixed relative residual error tolerance converges Q-linearly to a zero of the non-linear operator under consideration. Using this result we show that the Newton method for minimizing a self-concordant function, or for finding a zero of an analytic function, can be implemented with a fixed relative residual error tolerance. In th...
Communication-Efficient Distributed Optimization of Self-Concordant Empirical Loss
We consider distributed convex optimization problems that originate from sample average approximation of stochastic optimization, or empirical risk minimization in machine learning. We assume that each machine in the distributed computing system has access to a local empirical loss function, constructed with i.i.d. data sampled from a common distribution. We propose a communication-efficient distri...
DiSCO: Distributed Optimization for Self-Concordant Empirical Loss
We propose a new distributed algorithm for empirical risk minimization in machine learning. The algorithm is based on an inexact damped Newton method, where the inexact Newton steps are computed by a distributed preconditioned conjugate gradient method. We analyze its iteration complexity and communication efficiency for minimizing self-concordant empirical loss functions, and discuss the resul...
Convergence analysis of inexact proximal Newton-type methods
We study inexact proximal Newton-type methods to solve convex optimization problems in composite form: minimize_{x ∈ R^n} f(x) := g(x) + h(x), where g is convex and continuously differentiable and h : R^n → R is a convex but not necessarily differentiable function whose proximal mapping can be evaluated efficiently. Proximal Newton-type methods require the solution of subproblems to obtain the search ...
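For reference, the subproblem that such proximal Newton-type methods solve, exactly or inexactly, to obtain the search direction can be written as the following scaled proximal step; this is the standard form, stated here for clarity rather than quoted from the abstract:
\[
d_k \in \operatorname*{argmin}_{d \in \mathbb{R}^n} \; \nabla g(x_k)^\top d + \tfrac{1}{2}\, d^\top \nabla^2 g(x_k)\, d + h(x_k + d), \qquad x_{k+1} = x_k + t_k d_k ,
\]
where t_k is a step length chosen, for example, by a line search.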
Journal: Math. Meth. of OR
Volume: 85, Issue: -
Pages: -
Publication date: 2017